# FILE NAME: 00001332.soc
# TITLE: Should everyone be granted the legal right to know the algorithms used to analyse their online behaviour? [759b3d52094ad378cc333ed1594e8fb0]
# DESCRIPTION:
# DATA TYPE: soc
# MODIFICATION TYPE: original
# RELATES TO:
# RELATED FILES:
# PUBLICATION DATE: 2025-10-12
# MODIFICATION DATE: 2025-10-12
# NUMBER ALTERNATIVES: 4
# NUMBER VOTERS: 5
# NUMBER UNIQUE ORDERS: 4
# ALTERNATIVE NAME 1: Statement 1 - In general, people should have the legal right to know the algorithms used to analyse their online behaviour. The algorithms should be available for scrutiny and challenge if necessary. This would help to ensure that the algorithms are not being used in a way which is detrimental to the individual. This would also help to ensure that the algorithms are not being used to perpetuate bias and stop personal choice. However, there are some circumstances in which the algorithms should not be made available to the individual. This would be the case if the algorithms were being used to protect the individual, for example to protect them from being a victim of online crime.
# ALTERNATIVE NAME 2: Statement 2 - In general, the algorithms used to analyse online behaviour should be accessible to the individual and not kept as a secret. This right is not automatic and it is not automatic that the person has a right to know the details of the algorithm itself. However, people should be allowed to make a request for the details of the algorithm used and how it is applied. If the request is lawful, then the algorithm must be revealed. There are some circumstances where the algorithms used will not be released, for example when national security or personal safety is involved.
# ALTERNATIVE NAME 3: Statement 3 - In general, the algorithms used to analyse online behaviour should be available for people to see. This would help to ensure that people are aware of what is happening and can make their own choices about what is happening to their data. However, there may be some cases where this is not appropriate, such as algorithms used to detect crime or terrorism.
# ALTERNATIVE NAME 4: Statement 4 - People should have the legal right to know the algorithms used to analyse their online behaviour, because these are not secret data, they are being used to analyse their behaviour. There is a fear of machines replacing humans. The algorithm should be able to explain itself in order to show that it is not biased.
2: 1,2,3,4
1: 3,2,1,4
1: 1,3,4,2
1: 2,1,3,4